Results 1 - 2 of 2
1.
7th IEEE-EMBS Conference on Biomedical Engineering and Sciences, IECBES 2022 - Proceedings: 318-323, 2022.
Article in English | Scopus | ID: covidwho-2302133

ABSTRACT

During the COVID-19 outbreak, many healthcare workers (HCWs) were infected because they failed to follow the correct procedure for donning and doffing personal protective equipment (PPE). To address this, we develop a gesture-controlled system that can not only train HCWs but also give them real-time guidance while they don and doff PPE, helping to prevent infection. We first use a hand-detection algorithm to locate the HCW and guide them into the proper area. They can then use our gesture-recognition algorithm to control playback of the videos that guide them through donning and doffing PPE. We verify the effectiveness of the system through a series of experiments, and the results demonstrate its value in protecting HCWs. © 2022 IEEE.
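The abstract outlines a two-step pipeline: detect the hand to position the worker, then recognize gestures to control playback of the guidance video. The sketch below illustrates that idea with MediaPipe Hands and OpenCV; the open-palm/fist mapping, the finger-counting heuristic, and the video file name are assumptions for illustration, not the authors' implementation.

# Minimal sketch of a gesture-driven playback controller (assumed design,
# not the paper's implementation). Uses MediaPipe Hands for hand detection.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def count_extended_fingers(landmarks):
    # Rough heuristic: a finger counts as extended if its tip is above its PIP joint.
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    return sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))

def main(video_path="donning_guide.mp4"):  # hypothetical guidance video
    guide = cv2.VideoCapture(video_path)
    cam = cv2.VideoCapture(0)
    playing = True
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cam.isOpened():
            ok, frame = cam.read()
            if not ok:
                break
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                fingers = count_extended_fingers(result.multi_hand_landmarks[0].landmark)
                # Assumed mapping: open palm (>= 4 extended fingers) plays, fist pauses.
                playing = fingers >= 4
            if playing:
                ok, guide_frame = guide.read()
                if ok:
                    cv2.imshow("PPE guidance", guide_frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cam.release()
    guide.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()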

2.
Intelligent Systems with Applications; 17, 2023.
Article in English | Scopus | ID: covidwho-2238890

ABSTRACT

In April 2020, as isolation measures began around the world to counter the spread of COVID-19, an increase in violence against women and children was observed, a phenomenon that has been named the Shadow Pandemic. To fight it, a Canadian foundation proposed the "Signal for Help" gesture, which lets people in danger alert others discreetly. The gesture quickly became known around the world and, even after COVID-19 isolation ended, it has been used in public places to signal danger and abuse. However, the signal only works if people recognize it and know what it means. To address this challenge, we present a workflow for real-time detection of "Signal for Help" based on two lightweight CNN architectures, dedicated to hand-palm detection and hand-gesture classification, respectively. Moreover, because no "Signal for Help" dataset exists, we create the first video dataset representing the "Signal for Help" hand gesture for detection and classification applications, comprising 200 videos. While the hand-detection task is based on a pre-trained network, the classification network is trained on the publicly available Jester dataset, which includes 27 classes, and fine-tuned on the "Signal for Help" dataset through transfer learning. The proposed platform achieves an accuracy of 91.25% and processes video at 16 fps on a machine with an Intel i9-9900K @ 3.6 GHz CPU, 31.2 GB of memory, and an NVIDIA GeForce RTX 2080 Ti GPU, and at 6 fps on an NVIDIA Jetson Nano developer kit as an embedded platform. The high performance and small model size of the proposed approach make it well suited to resource-limited devices and embedded applications, as confirmed by implementing the framework on the Jetson Nano Developer Kit. A comparison with state-of-the-art hand detection and classification models shows a negligible reduction in validation accuracy of around 3%, while the proposed model requires 4 times fewer resources and runs inference about 50% faster on the Jetson Nano platform, making it highly suitable for embedded systems. The developed platform and the created dataset are publicly available. © 2022
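The classification stage described here, a lightweight CNN pretrained on the 27-class Jester gesture dataset and then fine-tuned on the new "Signal for Help" videos via transfer learning, can be sketched roughly as follows. The backbone choice (MobileNetV3-Small), the per-frame formulation, and all hyperparameters are assumptions for illustration; the paper's actual network and training setup are not reproduced here.

# Minimal transfer-learning sketch (assumed setup, not the authors' code):
# a lightweight pretrained CNN is fine-tuned to recognise the "Signal for Help"
# gesture by replacing its final classification layer.
import torch
import torch.nn as nn
from torchvision import models

def build_finetune_model(num_target_classes=2):
    # MobileNetV3-Small stands in for the paper's lightweight classifier.
    model = models.mobilenet_v3_small(weights="IMAGENET1K_V1")
    # Freeze the convolutional feature extractor; only the classifier is fine-tuned.
    for p in model.features.parameters():
        p.requires_grad = False
    in_features = model.classifier[-1].in_features
    model.classifier[-1] = nn.Linear(in_features, num_target_classes)
    return model

model = build_finetune_model()
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of cropped hand frames.
images = torch.randn(8, 3, 224, 224)   # placeholder for detected hand crops
labels = torch.randint(0, 2, (8,))     # 1 = "Signal for Help", 0 = other gesture
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()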
